Building Memory-Augmented AI Architectures

Research Overview

We design memory-augmented architectural layers for personalised and long-lived human-centric AI systems. 

The objective is to develop AI systems that convert user input into personalised knowledge representations that are portable across computational environments, persistent over time, and privately maintained at the individual level. As structured understanding accumulates, these systems become progressively more data-efficient, requiring fewer generic prompts and less new information over time.

1. Temporal Memory Architecture

Memory is organised across time: what stays in the working context, what is summarised after each interactive session, and what becomes durable knowledge.

This is conceptually aligned with hippocampal–cortical consolidation — fast encoding followed by gradual stabilisation.


1.1 Short-term — Working Context

Session-level information used directly during interaction for high-fidelity reasoning.

  • Context window grounding

  • High precision, non-persistent

  • Optimised for real-time continuity

1.2 Medium-term — Session Summary

Structured compression of each interactive session for scalable recall across time.

  • Compression for long-horizon scale

  • Stable cross-session recall

  • Bridge into consolidation

1.3 Long-term — Persistent Knowledge

Durable memory of preferences, facts, and stable relationships that can evolve over time.

  • Queryable and updateable

  • Supports long-lived personalisation

  • Designed for continuous evolution
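The three temporal tiers above can be sketched as a minimal data model. All class and field names here are illustrative assumptions, not the system's actual schema; the point is only the flow from non-persistent context, through per-session summaries, into a durable, queryable store.

```python
from dataclasses import dataclass, field


@dataclass
class WorkingContext:
    """Short-term: high-fidelity, non-persistent session turns."""
    turns: list = field(default_factory=list)

    def add(self, turn):
        self.turns.append(turn)


@dataclass
class SessionSummary:
    """Medium-term: structured compression of one session."""
    session_id: str
    salient_facts: list


@dataclass
class PersistentKnowledge:
    """Long-term: durable facts that are queryable and updateable."""
    facts: dict = field(default_factory=dict)

    def update(self, key, value):
        self.facts[key] = value  # updates evolve the fact in place

    def query(self, key):
        return self.facts.get(key)


# End of a session: compress the working context into a summary,
# then promote stable facts into persistent knowledge.
ctx = WorkingContext()
ctx.add("user: I prefer concise answers")
summary = SessionSummary("s-001", salient_facts=["prefers concise answers"])
store = PersistentKnowledge()
store.update("preference.answer_style", "concise")
```

The working context is discarded after the session; only the summary and the promoted facts persist.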

Consolidation & Reconsolidation — the Core Mechanism

Through consolidation and reconsolidation, the memory system determines which traces are stabilised, which are updated upon reactivation, and which progressively weaken or decay.

This process is the bridge between interaction history and durable memory structure.
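One simple way to model the stabilise/reactivate/decay dynamics is a per-trace strength score with exponential decay and a reactivation boost. This is a toy sketch of the mechanism, not our actual consolidation policy; the half-life, boost, and threshold values are arbitrary assumptions.

```python
import math


class MemoryTrace:
    """One memory trace with a strength that decays unless reactivated."""

    def __init__(self, content, strength=1.0):
        self.content = content
        self.strength = strength

    def decay(self, elapsed, half_life=7.0):
        # Exponential decay: strength halves every `half_life` time units.
        self.strength *= math.exp(-math.log(2) * elapsed / half_life)

    def reactivate(self, boost=0.5):
        # Reconsolidation: a reactivated trace is re-stabilised (and could
        # also be rewritten with updated content at this point).
        self.strength = min(1.0, self.strength + boost)


def consolidate(traces, threshold=0.3):
    """Stabilise only traces strong enough to enter long-term memory."""
    return [t for t in traces if t.strength >= threshold]


a = MemoryTrace("prefers concise answers")
b = MemoryTrace("mentioned rain once")
a.decay(elapsed=7.0)   # one half-life: strength ~0.5
b.decay(elapsed=21.0)  # three half-lives: strength ~0.125
a.reactivate()         # the topic came up again, so it is re-stabilised
kept = consolidate([a, b])
```

Traces that are never reactivated fall below the threshold and are dropped, which is the "progressively weaken or decay" path described above.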


2. Structural Memory Representation

We treat long-term memory as an evolving representation, not a static log.

Our system combines semantic structure (graph) with episodic traces (vector retrieval), supported by conflict-aware updates.


2.1 Semantic — Graph-based Memory

A knowledge graph of entities, attributes, relations, provenance, and evidence.

  • Structured reasoning and querying

  • Provenance and evidence links

  • Stable long-horizon consistency
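A graph memory of this kind reduces, at minimum, to provenance-tagged triples with structured querying. The sketch below is a deliberately small illustration under that assumption; a production store would add attributes, evidence links, and indexing.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Edge:
    """One (subject, relation, object) fact with its provenance."""
    subject: str
    relation: str
    obj: str
    provenance: str  # which session or source supplied this fact


class KnowledgeGraph:
    def __init__(self):
        self.edges = []

    def add(self, subject, relation, obj, provenance):
        self.edges.append(Edge(subject, relation, obj, provenance))

    def query(self, subject=None, relation=None):
        """Return (object, provenance) pairs matching the given pattern."""
        return [
            (e.obj, e.provenance)
            for e in self.edges
            if (subject is None or e.subject == subject)
            and (relation is None or e.relation == relation)
        ]


kg = KnowledgeGraph()
kg.add("user", "prefers", "dark mode", provenance="session-12")
kg.add("user", "works_at", "Acme", provenance="session-03")
results = kg.query(subject="user", relation="prefers")
```

Carrying provenance on every edge is what makes later conflict detection and evidence-weighted updates possible.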

2.2 Episodic — Vector-based Memory

Context-rich traces for fuzzy recall, similarity matching, and nuanced retrieval.

  • High recall for soft memories

  • Similarity-based retrieval

  • Complements the knowledge graph
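Episodic retrieval of this kind is typically nearest-neighbour search over embeddings. A minimal cosine-similarity sketch, with hand-written toy vectors standing in for a real embedding model:

```python
import math


def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)


class EpisodicStore:
    """Vector store for fuzzy, similarity-based recall of episode traces."""

    def __init__(self):
        self.items = []  # (embedding, text) pairs

    def add(self, embedding, text):
        self.items.append((embedding, text))

    def recall(self, query, k=1):
        ranked = sorted(
            self.items, key=lambda item: cosine(query, item[0]), reverse=True
        )
        return [text for _, text in ranked[:k]]


store = EpisodicStore()
store.add([0.9, 0.1, 0.0], "talked about hiking the coast trail")
store.add([0.0, 0.2, 0.9], "debugged a flaky integration test")
hits = store.recall([1.0, 0.0, 0.0], k=1)
```

This soft recall complements the graph: the graph answers structured queries exactly, while the vector store surfaces related episodes that no exact query would match.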

2.3 Safety and Stability — Privacy, Conflict, Confidence, Versioning

Updates are cautious: conflicts trigger confidence changes and versioned states when needed.

  • Conflict detection across sessions

  • Confidence tracking and recency

  • Version control for memory evolution
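The cautious-update rule above can be made concrete: agreement reinforces confidence, while conflict versions the old state rather than silently overwriting it. The weights below are illustrative assumptions, not calibrated values.

```python
from dataclasses import dataclass, field


@dataclass
class Belief:
    """A memory slot that versions old states on conflict instead of overwriting."""
    key: str
    value: str
    confidence: float
    history: list = field(default_factory=list)  # (old value, old confidence)

    def update(self, new_value, evidence_weight=0.25):
        if new_value == self.value:
            # Agreement across sessions: reinforce confidence.
            self.confidence = min(1.0, self.confidence + evidence_weight)
        else:
            # Conflict: snapshot the old state, adopt the newer value
            # cautiously at low confidence.
            self.history.append((self.value, self.confidence))
            self.value = new_value
            self.confidence = evidence_weight


b = Belief("user.city", "Berlin", confidence=0.5)
b.update("Berlin")   # agreement: confidence rises
b.update("Lisbon")   # conflict: old state is versioned, not lost
```

Because the conflicting state is retained in `history`, a later retraction ("I'm back in Berlin") can restore it with its prior confidence instead of starting from zero.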

Episodic Buffer to Semantic Consolidation

Between raw interactions and long-term knowledge sits an episodic buffer that aggregates evidence, detects conflicts, and consolidates only what is supported into structured semantic memory.
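A minimal sketch of that buffer, assuming a simple support-count rule: an observation is promoted to semantic memory only after enough independent sessions attest to it. Conflict detection is elided here for brevity.

```python
from collections import Counter


class EpisodicBuffer:
    """Aggregates per-session observations before semantic consolidation."""

    def __init__(self, support_threshold=2):
        self.observations = Counter()
        self.support_threshold = support_threshold

    def observe(self, fact):
        self.observations[fact] += 1

    def consolidate(self):
        """Promote only facts with enough support; the rest stay buffered."""
        promoted = [
            fact for fact, count in self.observations.items()
            if count >= self.support_threshold
        ]
        for fact in promoted:
            del self.observations[fact]
        return promoted


buf = EpisodicBuffer()
buf.observe("user prefers morning meetings")  # session 1
buf.observe("user prefers morning meetings")  # session 2
buf.observe("user mentioned a cat")           # session 1 only
semantic_facts = buf.consolidate()
```

Single mentions remain in the buffer awaiting further evidence, which is what keeps one-off remarks out of long-term structured memory.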


3. System-level Memory Capabilities

Beyond storage, we study what memory enables at a system level: portability, separation by context, and cross-modal consolidation.

This is where our unified multi-modal direction sits, including work carried out in collaboration with external partners.

3.1 Portability — Portable Memory

Exploring representations and protocols that allow memory to move across products, business contexts, and heterogeneous agent ecosystems.

  • Interoperable memory formats

  • Policy-aware transfer

  • Ownership and governance by design

  • Extensible to third-party agents and skill systems
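One way to make portability concrete is a self-describing, policy-tagged export envelope that another agent ecosystem can validate and import. The `portable-memory/v0` format tag and the field layout below are hypothetical, shown only to illustrate the shape of an interoperable, policy-aware transfer.

```python
import json


def export_memory(facts, owner, policy):
    """Serialise memory into a self-describing, policy-tagged envelope."""
    envelope = {
        "format": "portable-memory/v0",  # hypothetical format identifier
        "owner": owner,                  # governance: who the memory belongs to
        "policy": policy,                # what the recipient may do with it
        "facts": facts,
    }
    return json.dumps(envelope, sort_keys=True)


def import_memory(blob):
    """Validate the envelope and return the transferred facts."""
    envelope = json.loads(blob)
    if envelope.get("format") != "portable-memory/v0":
        raise ValueError("unsupported memory format")
    return envelope["facts"]


blob = export_memory(
    {"preference.tone": "formal"}, owner="user-1", policy="read-only"
)
facts = import_memory(blob)
```

Keeping owner and policy inside the envelope, rather than in out-of-band metadata, is what makes governance travel with the memory itself.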

3.2 Separation — Splittable Memory

Context-aware activation that separates work, personal, and other domains without leakage.

  • Selective memory projection

  • Role/persona separation

  • Reduced privacy bleed
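Selective projection can be sketched as domain-tagged storage where recall only ever surfaces the active domain. A minimal illustration under that assumption:

```python
class SplittableMemory:
    """Memories carry a domain tag; recall projects only the active domain."""

    def __init__(self):
        self.items = []  # (domain, fact) pairs

    def store(self, domain, fact):
        self.items.append((domain, fact))

    def project(self, active_domain):
        # Selective projection: facts from other domains never leak
        # into the active context.
        return [fact for domain, fact in self.items if domain == active_domain]


mem = SplittableMemory()
mem.store("work", "leads the infra migration")
mem.store("personal", "training for a marathon")
work_view = mem.project("work")
```

The separation is enforced at retrieval time, so a work-context agent simply never sees personal-domain facts rather than having to filter them after the fact.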

3.3 Multi-modal — Cross-modal Consolidation

Unifying signals across text and other modalities into shared memory objects with evidence and confidence.

  • Unified memory objects across modalities

  • Cross-modal reinforcement for robustness

  • Conflict-aware multi-modal updating

Unified Multi-modal Memory Architecture

Cross-modal agreement strengthens confidence and consolidation.

Conflicts preserve uncertainty via versioning and drive further evidence gathering.
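That agreement/conflict rule can be sketched on a unified memory object holding per-modality observations of one claim. The confidence increments are arbitrary placeholders; the point is the asymmetry between agreement (reinforce) and conflict (version the old state and preserve uncertainty).

```python
from dataclasses import dataclass, field


@dataclass
class MemoryObject:
    """Unified memory object holding per-modality observations of one claim."""
    claim: str
    observations: dict = field(default_factory=dict)  # modality -> value
    confidence: float = 0.5
    versions: list = field(default_factory=list)

    def observe(self, modality, value):
        agrees = any(v == value for v in self.observations.values())
        if self.observations and not agrees:
            # Cross-modal conflict: snapshot the old state and lower
            # confidence, preserving uncertainty instead of overwriting.
            self.versions.append(dict(self.observations))
            self.confidence = max(0.0, self.confidence - 0.25)
        elif agrees:
            # Cross-modal agreement: reinforce consolidation.
            self.confidence = min(1.0, self.confidence + 0.25)
        self.observations[modality] = value


obj = MemoryObject(claim="user's preferred language")
obj.observe("text", "Python")
obj.observe("voice", "Python")  # a second modality agrees
```

A conflicting observation from a third modality would drop confidence back down and version the earlier state, which is the signal to go gather more evidence.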


Outcome

Temporal architecture, structural representation, and system-level capabilities together form a persistent and evolving memory substrate.

Over time, this supports consistent long-term identity grounded in consolidated and personalised knowledge rather than transient interaction history.


Interested in collaboration?

We work with research partners, enterprise teams, and product builders on:

  • Memory evaluation

  • Consolidation mechanisms

  • Unified multi-modal memory systems

Contact us to explore research collaborations.